k-NN Regression Adapts to Local Intrinsic Dimension

Author

  • Samory Kpotufe
Abstract

Many nonparametric regressors were recently shown to converge at rates that depend only on the intrinsic dimension of the data. These regressors thus escape the curse of dimensionality when high-dimensional data has low intrinsic dimension (e.g., lies on a manifold). We show that k-NN regression is also adaptive to intrinsic dimension. In particular, our rates are local to a query x and depend only on the way the masses of balls centered at x vary with radius. Furthermore, we show a simple way to choose k = k(x) locally at any x so as to nearly achieve the minimax rate at x in terms of the unknown intrinsic dimension in the vicinity of x. We also establish that the minimax rate does not depend on a particular choice of metric space or distribution: rather, this minimax rate holds for any metric space and doubling measure.
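
The abstract's key algorithmic point is that k can be chosen locally at each query x. Below is a minimal Python sketch of this idea, assuming Euclidean distance and a simple heuristic that balances a variance proxy (1/k) against a bias proxy (the squared distance to the k-th neighbor, weighted by a hypothetical constant lam). It illustrates why a good k varies with the local geometry around x; it is not the selection rule from the paper.

import numpy as np

def knn_regress(X, y, x, k):
    # Standard k-NN regression estimate at query x:
    # average the responses of the k nearest training points.
    d = np.linalg.norm(X - x, axis=1)   # distances from x to all training points
    idx = np.argsort(d)[:k]             # indices of the k nearest neighbors
    return y[idx].mean()

def local_k(X, x, k_grid, lam=1.0):
    # Illustrative local choice of k = k(x): balance a variance proxy (1/k)
    # against a bias proxy (squared k-th-neighbor distance). This is a
    # heuristic sketch, not the paper's rule; lam is a hypothetical weight.
    d = np.sort(np.linalg.norm(X - x, axis=1))
    scores = [1.0 / k + lam * d[k - 1] ** 2 for k in k_grid]
    return k_grid[int(np.argmin(scores))]

# Usage: data on a one-dimensional curve embedded in R^3, so the local
# intrinsic dimension (1) is much smaller than the ambient dimension (3).
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, size=500)
X = np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t), t])
y = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(500)

x_query = X[0]
k_hat = local_k(X, x_query, k_grid=np.arange(1, 100))
print(k_hat, knn_regress(X, y, x_query, k_hat))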

Related Articles

Rates of Uniform Consistency for k-NN Regression

We derive high-probability finite-sample uniform rates of consistency for k-NN regression that are optimal up to logarithmic factors under mild assumptions. We moreover show that k-NN regression adapts to an unknown lower intrinsic dimension automatically. We then apply the k-NN regression rates to establish new results about estimating the level sets and global maxima of a function from noisy o...


k-NN Regression on Functional Data with Incomplete Observations

In this paper we study a general version of regression where each covariate is itself a functional object, such as a distribution or a function. In real applications, however, we typically do not have direct access to such data; instead, only noisy estimates of the true covariate functions/distributions are available to us. For example, when each covariate is a distribution, then we might not be...


Adaptivity to Local Smoothness and Dimension in Kernel Regression

We present the first result for kernel regression where the procedure adapts locally at a point x to both the unknown local dimension of the metric space X and the unknown Hölder-continuity of the regression function at x. The result holds with high probability simultaneously at all points x in a general metric space X of unknown structure.


Fast, smooth and adaptive regression in metric spaces

It was recently shown that certain nonparametric regressors can escape the curse of dimensionality when the intrinsic dimension of data is low ([1, 2]). We prove some stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, adapts to intrinsic dimension, operates on general metrics, yields ...


Learning to Catch: Applying Nearest Neighbor Algorithms to Dynamic Control Tasks

This paper examines the hypothesis that locally weighted variants of k-nearest neighbor algorithms can support dynamic control tasks. We evaluated several k-nearest neighbor (k-NN) algorithms on the simulated learning task of catching a flying ball. Previously, local regression algorithms have been advocated for this class of problems. These algorithms, which are variants of k-NN, base their predic...


Publication date: 2011